Monte Carlo Theory as an Explanation of Bagging and Boosting

Authors

  • Roberto Esposito
  • Lorenza Saitta
Abstract

In this paper we propose the framework of Monte Carlo algorithms as a useful one to analyze ensemble learning. In particular, this framework allows one to guess when bagging will be useful, explains why increasing the margin improves performance, and suggests a new way of performing ensemble learning and error estimation.
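As an illustrative sketch of this reading (not code from the paper), bagging can be viewed as a Monte Carlo algorithm: each bootstrap replicate is one randomized run of the base learner, and the ensemble answers by majority vote over the runs. The dataset, base learner, and parameter values below are assumptions chosen only for the example.

```python
# Minimal sketch: bagging read as a Monte Carlo (majority-vote) procedure.
# Data, learner, and parameters are illustrative assumptions, not taken
# from Esposito & Saitta's paper.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=500, n_features=20, random_state=0)

n_runs = 25                                  # number of Monte Carlo runs (bootstrap replicates)
votes = np.zeros((n_runs, len(y)), dtype=int)

for t in range(n_runs):
    # One run = train the base learner on one bootstrap sample of the data.
    idx = rng.integers(0, len(y), size=len(y))
    tree = DecisionTreeClassifier(max_depth=3, random_state=t)
    tree.fit(X[idx], y[idx])
    votes[t] = tree.predict(X)

# Monte Carlo decision rule: answer with the majority of the independent runs.
ensemble_pred = (votes.mean(axis=0) >= 0.5).astype(int)
print("training-set accuracy of the majority vote:", (ensemble_pred == y).mean())
```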


Similar references

Importance Sampled Learning Ensembles

Learning a function of many arguments is viewed from the perspective of high-dimensional numerical quadrature. It is shown that many of the popular ensemble learning procedures can be cast in this framework. In particular randomized methods, including bagging and random forests, are seen to correspond to random Monte Carlo integration methods each based on particular importance sampling strate...
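As a rough illustration of the quadrature view mentioned in that abstract (my own sketch, not material from the paper), the snippet below compares plain Monte Carlo integration with an importance-sampled estimate of the same integral; the integrand, proposal density, and sample size are made-up assumptions.

```python
# Toy comparison of plain vs importance-sampled Monte Carlo integration.
# Integrand, proposal, and sample sizes are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)
f = lambda x: np.exp(-x ** 2)            # integrand on [0, 5]

# Plain Monte Carlo: uniform samples on [0, 5].
x_uniform = rng.uniform(0.0, 5.0, size=10_000)
plain_estimate = 5.0 * f(x_uniform).mean()

# Importance sampling: draw from an exponential proposal that concentrates
# mass where the integrand is large, then reweight each draw by f(x) / q(x).
lam = 1.0
x_prop = rng.exponential(1.0 / lam, size=10_000)
keep = x_prop <= 5.0                                  # restrict to the domain
q = lam * np.exp(-lam * x_prop[keep])                 # proposal density at kept points
is_estimate = (f(x_prop[keep]) / q).sum() / 10_000

print(plain_estimate, is_estimate)   # both close to sqrt(pi)/2 ≈ 0.886
```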


Improving reservoir rock classification in heterogeneous carbonates using boosting and bagging strategies: A case study of early Triassic carbonates of coastal Fars, south Iran

An accurate reservoir characterization is a crucial task for the development of quantitative geological models and reservoir simulation. In the present research work, a novel view of reservoir characterization is presented, using the advantages of thin section image analysis and intelligent classification algorithms. The proposed methodology comprises three main steps. First, four classes of...


Explaining Bagging with Monte Carlo Theory

In this paper we propose the use of the framework of Monte Carlo stochastic algorithms to analyze ensemble learning, specifically bagging. In particular, this framework allows one to explain bagging's behavior and also why increasing the margin improves performance. Experimental results support the theoretical analysis.


A Case Study on Bagging, Boosting, and Basic Ensembles of Neural Networks for OCR

We study the effectiveness of three neural network ensembles in improving OCR performance: (i) Basic, (ii) Bagging, and (iii) Boosting. Three random character degradation models are introduced in training individual networks in order to reduce error correlation between individual networks and to improve the generalization ability of neural networks. We compare the recognition accuracies of t...


On Bagging and Estimation in Multivariate Mixtures

Two bagging approaches, namely ½n-out-of-n sampling without replacement (subagging) and n-out-of-n sampling with replacement (bagging), have been applied to the problem of estimating the parameters of a multivariate mixture model. It has been observed, by Monte Carlo simulations and a real data example, that both bagging methods improve the standard deviation of the maximum likelihood estimator of the mix...
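As a quick illustration of the two resampling schemes named in that abstract (the data and the statistic below are assumptions, not taken from that study), the snippet draws ½n-out-of-n subsamples without replacement and n-out-of-n bootstrap samples with replacement, then aggregates a simple estimator over the replicates.

```python
# Sketch of the two resampling schemes: ½n-out-of-n without replacement
# (subagging) vs n-out-of-n with replacement (bagging). The data and the
# statistic (a sample mean) are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=1.0, size=200)
n = len(data)

def subagged(stat, n_rep=500):
    # ½n-out-of-n, sampled without replacement.
    return np.array([stat(rng.choice(data, size=n // 2, replace=False))
                     for _ in range(n_rep)])

def bagged(stat, n_rep=500):
    # n-out-of-n, sampled with replacement (ordinary bootstrap).
    return np.array([stat(rng.choice(data, size=n, replace=True))
                     for _ in range(n_rep)])

print("subagging estimate:", subagged(np.mean).mean())
print("bagging estimate:  ", bagged(np.mean).mean())
```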




Journal title:

Volume   Issue

Pages  -

Publication year: 2003